Exascale computing and big data
Authors
Daniel A. Reed, Jack Dongarra
Abstract
Similar resources
Exascale Computing and Big Data: The Next Frontier
For scientific and engineering computing, exascale (10¹⁸ operations per second) is the next proxy in the long trajectory of exponential performance increases that has continued for over half a century. Similarly, large-scale data preservation and sustainability within and across disciplines, metadata creation and multidisciplinary fusion, and digital privacy and security define the frontiers of b...
Scientific Discovery and Engineering Innovation Requires Unifying Traditionally Separated High-Performance Computing and Big Data Analytics. Exascale Computing and Big Data
Nearly two centuries ago, the English chemist Humphry Davy wrote: “Nothing tends so much to the advancement of knowledge as the application of a new instrument. The native intellectual powers of men in different times are not so much the causes of the different success of their labors, as the peculiar nature of the means and artificial resources in their possession.” Davy’s observation that adv...
New Execution Models are Required for Big Data at Exascale
Computing on Big Data involves algorithms whose performance characteristics are fundamentally different from those of traditional scientific computing applications. Supercomputers are programmed today using execution models, such as CSP (and its primary realization, MPI), that are designed and optimized for traditional applications, but those models have weaknesses when applied to Big Data appl...
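To make the contrast concrete, here is a minimal sketch (ours, not from the paper) of the CSP-style execution model that MPI realizes: a fixed set of ranks exchanging explicit, statically addressed messages. The irregular, data-dependent communication of many Big Data analytics workloads fits this pattern poorly. The example assumes only a standard MPI toolchain (mpicc, mpirun); the file name is hypothetical.

/* ring.c - minimal CSP/MPI-style message passing.
 * Compile and run with, e.g.:
 *   mpicc ring.c -o ring && mpirun -np 2 ./ring
 */
#include <mpi.h>
#include <stdio.h>

int main(int argc, char **argv) {
    MPI_Init(&argc, &argv);

    int rank, size;
    MPI_Comm_rank(MPI_COMM_WORLD, &rank);
    MPI_Comm_size(MPI_COMM_WORLD, &size);

    if (size < 2) {
        if (rank == 0) fprintf(stderr, "Run with at least 2 ranks.\n");
        MPI_Finalize();
        return 1;
    }

    if (rank == 0) {
        int payload = 42;
        /* Explicit send to a statically known peer (rank 1, tag 0):
         * the hallmark of the CSP/MPI execution model. */
        MPI_Send(&payload, 1, MPI_INT, 1, 0, MPI_COMM_WORLD);
    } else if (rank == 1) {
        int payload;
        MPI_Recv(&payload, 1, MPI_INT, 0, 0, MPI_COMM_WORLD,
                 MPI_STATUS_IGNORE);
        printf("rank 1 received %d from rank 0\n", payload);
    }

    MPI_Finalize();
    return 0;
}

Note that both the communication partners and the message sizes are fixed in the program text; execution models for Big Data, by contrast, typically must discover communication structure from the data at run time.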
Big Data and Fog Computing
Fog computing serves as a computing layer that sits between the edge devices and the cloud in the network topology. Fog nodes have more compute capacity than edge devices but far less than cloud data centers. They typically have high uptime and always-on Internet connectivity. Applications that make use of the fog can avoid the network performance limitation of cloud computing while being less resou...
The Exascale Computing Project
Exascale computing—computing capability that can achieve at least a billion billion operations per second, 50 to 100 times more powerful than the nation’s fastest supercomputers in use today—is the next waypoint in high-performance computing (HPC). Achieving such speeds isn’t an easy task. To be useful to a wide spectrum of applications, in addition to peak speed, supercomputers need to have l...
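As a rough sanity check on the quoted factor (our arithmetic, not from the snippet): exascale means 10^18 operations per second, and 10^18 / 50 = 2 × 10^16 while 10^18 / 100 = 10^16, i.e., 10 to 20 petaflops, which is consistent with the fastest U.S. supercomputers of the mid-2010s.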
Journal
Journal title: Communications of the ACM
Year: 2015
ISSN: 0001-0782, 1557-7317
DOI: 10.1145/2699414